Latent Variable Models for Dimensionality Reduction

Authors

  • Zhihua Zhang
  • Michael I. Jordan
Abstract

Principal coordinate analysis (PCO), the dual of principal component analysis (PCA), is also a classical method for exploratory data analysis. In this paper we propose a probabilistic PCO based on a normal latent variable model, in which maximum likelihood estimation and an expectation-maximization (EM) algorithm are respectively devised to compute the configurations of objects in a low-dimensional Euclidean space. We also devise probabilistic formulations for kernel PCA, a nonlinear extension of PCA.
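The abstract does not spell out the EM iteration. As a rough illustration of the kind of normal-latent-variable EM it refers to, here is a minimal sketch of EM for probabilistic PCA (Tipping and Bishop's model, the PCA-side analog; the paper's probabilistic PCO instead works from the dual, distance-based representation). The function name `ppca_em` and all parameter choices are illustrative, not from the paper.

```python
import numpy as np

def ppca_em(X, q=2, n_iter=100, seed=0):
    """EM for probabilistic PCA: x = W z + mu + noise, z ~ N(0, I),
    noise ~ N(0, sigma2 * I). Returns loading matrix W, noise variance
    sigma2, posterior latent means Ez, and the data mean mu."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu                                   # centered data
    W = rng.standard_normal((d, q))               # random init
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior over latents given current W, sigma2
        M = W.T @ W + sigma2 * np.eye(q)          # q x q
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                        # N x q posterior means
        EzzT = N * sigma2 * Minv + Ez.T @ Ez      # sum_n E[z_n z_n^T]
        # M-step: closed-form updates for W and sigma2
        W = (Xc.T @ Ez) @ np.linalg.inv(EzzT)
        sigma2 = (np.sum(Xc**2)
                  - 2.0 * np.sum(Ez * (Xc @ W))
                  + np.trace(EzzT @ W.T @ W)) / (N * d)
    return W, sigma2, Ez, mu
```

The posterior means `Ez` give the low-dimensional configuration of the objects, which is the quantity the probabilistic PCO formulation also targets, just starting from inner products or distances rather than raw coordinates.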


Similar articles

Description of current research

This document describes the research I have done since October 1995 for my PhD thesis at the Dept. of Computer Science, University of Sheffield, U.K., which I expect to complete by October/November 1999. My thesis research has involved two generic fields of machine learning: dimensionality reduction and sequential data reconstruction, which I have approached from the common point of view of lat...


Gaussian Process Latent Variable Models for Dimensionality Reduction and Time Series Modeling

High-dimensional time series data are frequently encountered in fields such as robotics, computer vision, economics and motion capture. In this survey paper we look first at the Gaussian Process Latent Variable Model (GPLVM), a probabilistic nonlinear dimensionality reduction method. We then discuss Gaussian Process Dynamical Models (GPDMs), which are based on the GPLVM. The GPDM is a probabilistic ap...


Bayesian Gaussian Process Latent Variable Model

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variation...


Shared Gaussian Process Latent Variable Models

A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in a parameterization of a data-set. In this thesis we are interested in the crossroads between these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spa...


Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of res...


Dimensionality reduction of electropalatographic data using latent variable models

We consider the problem of obtaining a reduced dimension representation of electropalatographic (EPG) data. An unsupervised learning approach based on latent variable modelling is adopted, in which an underlying lower dimension representation is inferred directly from the data. Several latent variable models are investigated, including factor analysis and the generative topographic mapping (GTM...




Publication date: 2009